Show your forecasting skills in Good Judgment Open


Ever wondered how good you are at forecasting? As a business forecaster, you can do the usual comparison against a naive model (and hopefully you are beating it!). You might also compare your forecast accuracy to published industry benchmarks -- although I would strongly recommend against this. (See the section below for reasons why.)
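If you'd like to run that naive-model comparison yourself, here's a minimal sketch in Python (the demand numbers are made up for illustration). It scores a forecast against a last-value naive forecast using MAE; a relative MAE below 1 means your forecast is adding value over the naive model:

```python
# Score a forecast against a naive (last observed value) forecast.
# The demand numbers below are made up for illustration.

actuals   = [102, 98, 110, 105, 99, 107]
forecasts = [100, 101, 106, 108, 101, 104]

# Naive forecast: each period's prediction is the prior period's actual,
# so it can only be scored on periods 2..n.
naive = actuals[:-1]

errors_fc    = [abs(a - f) for a, f in zip(actuals[1:], forecasts[1:])]
errors_naive = [abs(a - n) for a, n in zip(actuals[1:], naive)]

mae_fc    = sum(errors_fc) / len(errors_fc)
mae_naive = sum(errors_naive) / len(errors_naive)

# Relative MAE < 1: the forecast beats the naive model.
print(f"MAE (forecast): {mae_fc:.2f}")
print(f"MAE (naive):    {mae_naive:.2f}")
print(f"Relative MAE:   {mae_fc / mae_naive:.2f}")
```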

But if you're willing to test your performance in a big pond, against the big fish, in a wide variety of forecasting challenges, sign up for Good Judgment Open. The site comes from the authors of Superforecasting; here is their description:

Start keeping score.

Are you a Superforecaster®? Can you predict the future better than the pundits in the media? Join Good Judgment Open, the site for serious forecasting.

GJ Open is more than a game. It’s the best possible place to hone your forecasting skills. We’ve designed GJ Open specifically to help you improve your forecasting abilities. Make a forecast, explain your reasoning (and be challenged by others), and find out how you stack up against the crowd.

From the future of US politics to international affairs, from technology to sports and entertainment, there's bound to be something in your wheelhouse.

Still not sure? Check out our active challenges, our featured questions, or browse a list of all questions on GJ Open.

In the book, you can find out more about the Good Judgment Project and learn what it takes to become a recognized superforecaster.

Why Not to Compare to Industry Benchmarks

The BFD has previously discussed this issue in The Perils of Forecasting Benchmarks and More on Forecasting Benchmarks.

My argument against benchmarks rests on the trustworthiness of the data, the consistency of measurement across benchmark participants, and, most important, the relevance of comparing performance between organizations whose data likely differ in forecastability. I suspect that the "best in class" forecasters earn that status because they have the easiest-to-forecast demand -- not necessarily because they have the most admirable forecasting processes.
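To make the forecastability point concrete, here's a small sketch using the coefficient of variation as a crude proxy for how hard a demand series is to forecast. The series and the CV-as-proxy choice are my own illustration (real forecastability assessment is more involved), but they show why two companies with identical processes can post very different accuracy numbers:

```python
import statistics

# Two hypothetical demand histories (made-up numbers).
smooth_demand   = [100, 103, 98, 101, 99, 102, 100, 97]
volatile_demand = [40, 180, 10, 220, 95, 5, 160, 30]

def cv(series):
    """Coefficient of variation (std dev / mean): a crude forecastability proxy."""
    return statistics.stdev(series) / statistics.fmean(series)

# The volatile series is inherently harder to forecast, so the company
# holding it will look worse in a benchmark survey regardless of process.
print(f"CV, smooth demand:   {cv(smooth_demand):.2f}")
print(f"CV, volatile demand: {cv(volatile_demand):.2f}")
```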

For a thorough and definitive discussion of the topic of benchmarking, see Stephan Kolassa's article "Can We Obtain Valid Benchmarks from Published Surveys of Forecast Accuracy?", originally published in Foresight (Fall 2008) and reprinted in Business Forecasting: Practical Problems and Solutions.

About Author

Mike Gilliland

Product Marketing Manager

Michael Gilliland is a longtime business forecasting practitioner and formerly a Product Marketing Manager for SAS Forecasting. He is on the Board of Directors of the International Institute of Forecasters, and is Associate Editor of their practitioner journal Foresight: The International Journal of Applied Forecasting. Mike is author of The Business Forecasting Deal (Wiley, 2010) and former editor of the free e-book Forecasting with SAS: Special Collection (SAS Press, 2020). He is principal editor of Business Forecasting: Practical Problems and Solutions (Wiley, 2015) and Business Forecasting: The Emerging Role of Artificial Intelligence and Machine Learning (Wiley, 2021). In 2017 Mike received the Institute of Business Forecasting's Lifetime Achievement Award. In 2021 his paper "FVA: A Reality Check on Forecasting Practices" was inducted into the Foresight Hall of Fame. Mike initiated The Business Forecasting Deal blog in 2009 to help expose the seamy underbelly of forecasting practice, and to provide practical solutions to its most vexing problems.
